A New Oscillating-Error Technique for Classifiers

Author

  • Kieran Greer
Abstract

This paper describes a new method for reducing the error in a classifier. It uses a weight adjustment update, but includes the very simple rule of either adding or subtracting the adjustment, based on whether the data point is currently larger or smaller than the desired value, applied on a point-by-point basis. This gives added flexibility to the convergence procedure, where through a series of transpositions, values far away can continue towards the desired value, whereas values that are originally much closer can oscillate from one side to the other. Tests show that the method can successfully classify some known datasets. It can also work in a batch mode, with reduced training times, and can be used as part of a neural network, or with classifiers in general. There are also some updates on an earlier wave shape paper.
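As a rough illustration only, the following is a minimal sketch of the sign-based correction rule the abstract describes, assuming a fixed adjustment size applied to a vector of values; the function name, the NumPy representation and the number of passes are illustrative assumptions rather than the paper's exact procedure.

import numpy as np

def oscillating_error_step(values, targets, adjustment=0.1):
    """One correction pass of the sign-based rule sketched in the abstract.

    Every value receives the same fixed adjustment: it is added when the
    value is below its desired target and subtracted when it is above.
    Values far from the target keep moving towards it, while values that
    are already close oscillate from one side of the target to the other.
    """
    direction = np.sign(targets - values)   # +1 if too small, -1 if too large, 0 if exact
    return values + adjustment * direction


# Hypothetical usage: repeat the pass a few times, or until the error stops improving.
values = np.array([0.9, 0.2, 0.55])
targets = np.array([0.5, 0.5, 0.5])
for _ in range(5):
    values = oscillating_error_step(values, targets)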


Similar Articles

A new two-step Obrechkoff method with vanished phase-lag and some of its derivatives for the numerical solution of radial Schrodinger equation and related IVPs with oscillating solutions

A new two-step implicit linear Obrechkoff twelfth algebraic order method with vanished phase-lag and its first, second, third and fourth derivatives is constructed in this paper. The purpose of this paper is to develop an efficient algorithm for the approximate solution of the one-dimensional radial Schrodinger equation and related problems. This algorithm belongs in the category of the multist...


An Improved Oscillating-Error Classifier with Branching

This paper extends the earlier work on an oscillating error correction technique. Specifically, it extends the design to include further corrections, by adding new layers to the classifier through a branching method. This technique is still consistent with earlier work and also neural networks in general. With this extended design, the classifier can now achieve the high levels of accuracy repo...


A New Hybrid Framework for Filter based Feature Selection using Information Gain and Symmetric Uncertainty (TECHNICAL NOTE)

Feature selection is a pre-processing technique used to eliminate irrelevant and redundant features, which enhances the performance of classifiers. When a dataset contains many irrelevant and redundant features, accuracy fails to increase and classifier performance degrades. To avoid this, this paper presents a new hybrid feature selection method usi...
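For context only: information gain and symmetric uncertainty have standard definitions, and the sketch below shows how they are typically computed for discrete features. It is not the hybrid selection procedure of that paper; the function names and the assumption of already-discretised features are illustrative.

import numpy as np
from collections import Counter

def entropy(x):
    """Shannon entropy (in bits) of a discrete-valued array."""
    counts = np.array(list(Counter(x).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(feature, label):
    """IG(feature; label) = H(label) - H(label | feature)."""
    total = len(label)
    conditional = sum((feature == v).sum() / total * entropy(label[feature == v])
                      for v in np.unique(feature))
    return entropy(label) - conditional

def symmetric_uncertainty(feature, label):
    """SU = 2 * IG / (H(feature) + H(label)), normalised to the range [0, 1]."""
    denominator = entropy(feature) + entropy(label)
    return 0.0 if denominator == 0 else 2.0 * information_gain(feature, label) / denominator


# Hypothetical usage: keep feature columns whose SU with the class label exceeds a threshold.
X = np.array([[0, 1], [1, 1], [0, 0], [1, 0]])
y = np.array([0, 1, 0, 1])
selected = [j for j in range(X.shape[1]) if symmetric_uncertainty(X[:, j], y) > 0.5]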


A New Technique for Combining Multiple Classifiers using The Dempster-Shafer Theory of Evidence

This paper presents a new classifier combination technique based on the Dempster-Shafer theory of evidence. The Dempster-Shafer theory of evidence is a powerful method for combining measures of evidence from different classifiers. However, since each of the available methods that estimates the evidence of classifiers has its own limitations, we propose here a new implementation which adapts to t...
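The basic Dempster-Shafer combination rule is standard, so a minimal sketch may help: it combines two mass functions by multiplying the masses of intersecting hypotheses and renormalising away the conflicting mass. It is not the adapted evidence-estimation scheme that paper proposes, and the example masses below are invented.

from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset hypotheses to mass)
    using Dempster's rule: conflicting mass is discarded and the rest renormalised."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        intersection = a & b
        if intersection:
            combined[intersection] = combined.get(intersection, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: the two sources cannot be combined")
    return {h: w / (1.0 - conflict) for h, w in combined.items()}


# Invented example: evidence from two classifiers over the classes {A, B}.
m1 = {frozenset({"A"}): 0.7, frozenset({"A", "B"}): 0.3}
m2 = {frozenset({"B"}): 0.4, frozenset({"A", "B"}): 0.6}
print(dempster_combine(m1, m2))   # most of the combined mass ends up on class A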


Application of ensemble learning techniques to model the atmospheric concentration of SO2

In view of pollution prediction modeling, the study adopts homogeneous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of sulphur dioxide. For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression and re...
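As a rough sketch of that comparison (not the paper's data or configuration), scikit-learn offers off-the-shelf versions of the ensembles and base learners named above; synthetic data stands in for the SO2 measurements, and the additive-regression ensemble is omitted because scikit-learn has no direct equivalent.

from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor, RandomForestRegressor, VotingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

# Synthetic stand-in for the air-quality dataset (features -> SO2 concentration).
X, y = make_regression(n_samples=300, n_features=8, noise=0.5, random_state=0)

single = {"svm": SVR(), "mlp": MLPRegressor(max_iter=2000), "linear": LinearRegression()}
homogeneous = {"random_forest": RandomForestRegressor(), "bagging": BaggingRegressor()}
heterogeneous = {"voting": VotingRegressor(list(single.items()))}

# Compare every model with the same cross-validation protocol and report RMSE.
for name, model in {**single, **homogeneous, **heterogeneous}.items():
    rmse = -cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error").mean()
    print(f"{name}: RMSE = {rmse:.3f}")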



Journal:
  • CoRR

Volume: abs/1505.05312  Issue: -

Pages: -

Publication date: 2015